Market Roundup
May 28, 2004
IBM Announces Next Gen SAN File System
IBM has announced the next generation of its storage virtualization
software, which includes support for IBM and non-IBM systems to help simplify
the management of complex, mixed storage environments. The IBM TotalStorage SAN File System (SFS) V2.1, a network-based
heterogeneous file system that provides a single, centralized point of control
to manage files and databases, has been enhanced to support IBM and non-IBM
Storage Area Network (SAN) attached storage devices, including devices from
EMC, HP, and Hitachi. The software is designed without inherent limitations on
the amount of storage that can be supported. According to IBM, the software is
designed to help reduce the overall amount of storage required in a customer's
infrastructure by both allowing storage resources to be shared more efficiently
across servers and reducing the need for duplicate copies of files. The company
said the SAN File System has the ability to help clients improve application
availability by reducing downtime for storage and data management tasks, and
also features a number of availability and security enhancements, including
improvements to the administrative user interface, that make data easier to
manage. Additional host operating system support includes Red Hat Enterprise
Linux 3.0 and Sun Solaris 9. The IBM TotalStorage
SAN File System Version 2.1 will be available on June 29, 2004. No pricing
information was included in the announcement.
Pursuing visionary goals is tough work on the best of
days. While many want to reside in a shining city on a hill, most ignore the
fact that the road there is paved with deep ruts, hard rocks, and sharp nails,
as well as the occasional precipitous drop-off into a fiery pit. However, to
keep spirits up and convince bystanders that one remains dedicated, it is
common enough to report on the incremental steps of the journey. The problem is
that while such efforts allow supporters to cheer one’s achievements, they also
provide one’s detractors a chance to note how long the journey is taking or how
much distance remains. IBM’s enhancement of its SFS offers a case in
point. Originally discussed as part of the company’s now defunct StorageTank initiative, SFS now plays a critical role
(along with the SAN Volume Controller and the TotalStorage
Productivity Center) in IBM’s Virtualization Engine Suite for Storage, whose
goal is to allow enterprises to easily, centrally, and efficiently manage all
the data stored in heterogeneous IT infrastructures, regardless of which
vendors’ devices are involved. The fact is that while most storage vendors are
headed toward this same laudable goal, getting there is no easy task, requiring
countless incremental steps (and sometimes missteps) along the way.
With that in mind, where does IBM’s SFS V2.1 leave the company? It is important to remember that the original version of SFS announced last October worked only with IBM products, which is the typical beginning of such efforts. From that standpoint alone, SFS V2.1 has come quite a distance, supporting SAN-attached devices from EMC, HDS, and HP, as well as IBM, and delivering host OS support for Red Hat Linux and Solaris. In addition, the new software-only package includes functional, security, and high-availability enhancements not found in the first release. Perhaps most important of all, these new features were delivered in the timeframe IBM originally established. In other words, the company’s trip to storage virtualization appears to be on track and on schedule. That said, while SFS V2.1 offers significantly enhanced performance, much work remains. Expanded OS support will be critical to delivering a solution of interest to the greater market, as will establishing the role SFS and related IBM virtualization solutions can and will play in strategic efforts such as information lifecycle management (ILM) and in increasingly influential IT approaches such as grid computing. The SAN File System V2.1 offers the company and its allies a good deal to cheer. IBM’s vision of storage virtualization remains a distant goal, but that shining city on the hill is measurably closer than it was before.
Reap What You Know: EMC and HP Deliver New
Solutions for SMBs
This week EMC introduced the CLARiiON AX100, which the
company said allows customers to move from internal server-based storage to
cost-effective external DAS, SAN, or NAS. Available in a 2U rack-mounted
enclosure, the AX100 incorporates technologies from EMC’s
CLARiiON CX product family, but utilizes affordable ATA disk solutions. Fibre Channel switch and host bus adapter (HBA) products
from Brocade, Emulex, and QLogic
allow the AX100 to be deployed as an entry-level SAN solution, and the AX100
can also be used as the direct-attached array for EMC’s
new NetWin 110 NAS system. The AX100 is available
from EMC’s worldwide network of channel partners,
resellers, and OEMs including Dell (as the Dell/EMC AX100), Tech Data,
Fujitsu-Siemens (as the FibreCAT AX100), Arrow
Electronics Inc., Avnet Hall-Mark, Bell Micro, MTI, Ideal, CDW, OpenWay, Samsung (available as the StorageMax
ZCX 100), Skydata, Bull, Source, and others. The
AX100 supports Windows, Linux, and NetWare, can scale up to 3 terabytes of
capacity, and is available at prices starting under $5,999. Separately, HP
announced immediate availability of the StorageWorks NAS 1200s
and 2000s, which the company described as the first solutions to consolidate
Exchange Server 2003 data stores on Windows Storage Server 2003-based NAS
systems. The company also announced an agreement with CommVault
to provide SMBs running Windows-based HP NAS devices with an integrated
software suite for data management and protection. The HP StorageWorks NAS
2000s and 1200s with support for Microsoft's Exchange Server 2003 Feature Pack
are available with base pricing beginning at about $5,800 for the 2000s model
and $2,495 for the 1200s.
Though notably different in performance and intent, EMC’s and HP’s new SMB products offer similar lessons in
how IT vendors can leverage existing strengths to bolster current efforts or
explore new markets. In the case of HP, the StorageWorks NAS 1200s/2000s give the
company a pair of inflection points to press its “Easy as NAS” program, a
joint effort with Microsoft to accelerate both HP NAS sales and the adoption of
Windows Storage Server 2003. By facilitating the consolidation of file, print,
and Exchange Server 2003 data stores on the new systems, the companies are
pushing the notion of reducing costs and complexity by improving operational
efficiency. This is not a particularly revolutionary notion in IT circles, but
it could resonate among cost-conscious SMBs, especially those who rely on
Microsoft’s Windows Server solutions to drive their business processes. The new
products could also be a boon for HP’s StorageWorks products, which suffered
notable declines in the company’s recent earnings announcement. Given its
ongoing struggle with Dell in the desktop and IA-32-based server markets, HP
needs every boost it can get to improve its lot among SMBs, and this
concerted co-marketing effort could pay dividends.
EMC’s AX100 follows a somewhat similar path to an altogether different destination, leveraging the company’s well-established strengths in SAN technology for a new market. To be sure, the AX100 also carries the fingerprints of the company’s partnership with Dell, and allows both companies to deliver entry-level SANs with notable price/performance. For Dell, this means new sales opportunities among its myriad existing SMB customers. For EMC, it means providing its enterprise-focused OEM and channel partners affordable entry points to SMB customers. The ability to deploy the AX100 either as a free-standing SMB SAN or, with the addition of the NetWin 110, as a SAN-friendly NAS solution demonstrates EMC’s approach to leveraging in-house skills and assets. It is interesting that the AX100 has drawn scorn from some in the storage world, including SAN aficionados who suggest its use of ATA drives sullies a class of solutions best served by Fibre Channel. To a degree, these arguments remind us of RISC proponents who decried IA-32-based servers as inappropriate for enterprise applications. The fact is that evolutionary change in the IT industry tends to bubble up from below rather than fall sweetly from above. The increasing popularity of ATA technologies in enterprise-focused products such as EMC’s Centera CAS solutions suggests that fundamental, commodity-based changes in the storage industry are well and appropriately under way.
IBM Launches Virtual Innovation Center for ISVs
IBM has announced a new program in which ISVs can access
IBM technology — both software and hardware — through an IBM Virtual Innovation
Center to help build, sell, and deploy solutions for the mid-tier market. The
Virtual Innovation Center allows ISVs to remotely access IBM hardware and
software through a secure Web portal and provides them access to one or more
servers for up to fourteen days. The access is via IBM grid technology, and
allows for specific testing and development environments to be created for the
ISVs using server partitioning and virtualization. As a result, ISVs do not
need to actually have IBM hardware and software on site, but instead can access
their specific environment from anywhere on the planet. The IBM Virtual
Innovation Center is part of the company’s larger $500 million effort to
capture SMB market share through its various business partners and ISVs.
One suspects that many or even most IBMers
feel a little like a kid in a candy store. IBM offers a wide array of
technology that no other vendor can match at this time, and the company is
showing a distinct predilection toward leveraging its product portfolio across
as wide an array of markets as possible. In IBM’s Virtual Innovation Center, we
see the company using field-tested grid solutions to create an entirely new
delivery model for business partners who, in turn, will help IBM capture more
market share in the SMB space. That is what you call leverage.
We have noted before IBM’s ongoing effort to woo the SMB market, in part by delivering a wide array of solution sets that support the efforts of its business partners and ISVs. We believe that strategy is sound, especially when one considers that most SMBs buy their IT inventory from third party providers who own the relationship with the customer and in many cases provide specific vertical market expertise. By using its grid computing technology to deliver yet another service to assist these business partners, IBM gets a two-fer. First, the Virtual Innovation Center offers a platform for demonstrating the capabilities grid computing applications offer real world businesses, which should help define and add tangibility to the otherwise murky idea many people have of what grid computing actually is. Offering such a resource to its partners should allow IBM myriad opportunities to effectively evangelize grid computing as an integral part of the company’s larger On Demand computing paradigm, and do so in a way that allows actions to speak louder than mere marketing collateral. In addition, the Virtual Innovation Center allows IBM to offer its SMB-directed business partners a new, more seamless way to deliver SMB-targeted products based on IBM technology. If that helps speed the delivery of new offerings to the SMB market, it will help IBM partners meet their customers’ needs in ever shorter development cycles. In a market as driven by the need for vendor responsiveness and inhibited by slender margins as the SMB sector, helping its partners respond quickly and cost-effectively to customer demands is a strategy that benefits every member of the value chain, including IBM.
More Data Does Not Necessarily Mean Better Data
The Department of Homeland Security is in the final stages
of awarding a $15 billion contract to create what officials call a “virtual
border” to track visitors to the United States. Using a network of databases,
the department hopes to track the millions of visitors who enter the country
each year through some 300 points of entry at airports, land borders, and
seaports. The system will connect
approximately twenty different federal databases, allowing immigration
officials to verify each visitor's authorization and identity in real time
at the various points of entry. The system will then purportedly
be able to track visitors during their visits to the U.S. and determine if they
have obeyed various immigration laws or if they are engaging in suspicious
activity. The program is scheduled to be rolled out over the next decade, and
will possibly use a number of different identification means, such as
fingerprinting, iris scans, or other biometric systems, to identify
the roughly 300 million people who visit the country each year.
Will this system work? Will it be worth $15 billion? Will
it actually make the country safer? Perhaps, but we suspect there is a bit of wishful
thinking at work here. First of all, the idea of this virtual border smacks of
an over-optimistic reliance on technology to solve what amounts to a human
problem. One thing is very sure, however, and that is that this system will
generate huge amounts of data. Will it all be accurate? Certainly the databases
involved will not be able to determine so. Instead, for this system to work,
humans are going to have to play a key role in providing the final judgments on
whether or not an individual is dangerous, or just an odd statistical outlier.
We also have to wonder whether the government is a bit behind the private sector in making reasonable evaluations of technology and its purported benefits. As much of the private sector learned painfully during the go-go 1990s, technology itself is not a be-all and end-all. IT and data are only as good as their ability to place relevant information in the hands of people when they need it, an idea that fuels much of the skepticism now found in private sector IT evaluations. What the private sector learned, simply, is that if it sounds too good to be true, it probably is. That said, the fact remains that IT currency demands ongoing purchases and upgrades across the enterprise IT footprint. In selling their wares, IT vendors must ensure they can describe and deliver realistic value propositions to their customers, ones that make their lives, and their access to critical information, simpler, not more complex.